101Innovationslogo

The UBVU (VU University Library) wants its services to align as closely as possible with the scholarly communication process of the researchers of VU and VUmc.

The scholarly communication process is in full flux, driven by the use of digital tools across the research phases: Discovery, Analysis, Writing, Publication, Outreach and Assessment.

The project investigates the use of these digital tools within VU and VUmc, and joins an international study so that the results can be compared with other countries.

The results below are intended as input for conversations with the faculties about improving, renewing and changing our services, so that they better match the phases of the researchers' research practice.

1 Approach

Research questions were drawn up by the subject and theme specialists. These are the questions needed to hold the conversation. For each question a framework was created within which the answer can be given from the survey results. For reasons of time, we chose to answer only the questions with the highest urgency.

The questions must provide answers for two question categories: 1. tool usage within the VU as a whole; 2. tool usage within disciplines.

The latter category in particular is interesting for subject and theme specialists, who gain insight into the tool usage within the discipline they represent.

2 Data gathering

With this survey we piggybacked on an existing study: Kramer, B. and Bosman, J., Innovations in scholarly communication - global survey on research tool usage [version 1; referees: awaiting peer review]. F1000Research 2016, 5:692 (doi: 10.12688/f1000research.8414.1)

We requested a custom URL, so that VU and VUmc researchers can be distinguished in the incoming data by the hash 7V4u8a. From this custom URL a shortened URL was made [http://bit.ly/vu101innovations], so that we could more easily track the distribution activity. We asked the research portfolio holders to forward this shortened URL to their researchers.

VU101webactivity


Two e-mail campaigns yielded 543 visitors in January and 296 visitors in February, 839 visitors in total.

VU and VUmc together have approximately 6000 scientific staff members.

3 Results

The question categories also return in the numbering of the results.

Each question has an answer, supplemented with diagrams. A question starts with a summarising explanation and diagram, followed by sub-sections with more detailed diagrams.

The Scholarly Communication phases will recur throughout the report: Discovery, Analysis, Writing, Publication, Outreach and Assessment.

VU101innovations


4 Demographics

These demographics form the baseline of our study.

4.1 Survey outcomes

Number of respondents Value
World Wide 20663
OECD countries 15752
Netherlands 2041
VU and VUmc 531

Survey Respondents Worldwide

The values below are within the set of VU & VUmc respondents.

Discipline (multi-choice) Value
Physical Sciences 39
Engineering and Technology 35
Life Sciences 144
Medicine 181
Social Sciences and Economics 176
Law 26
Arts & Humanities 55

Role Value
Number of PhDs 230
Number of PostDocs 70
Number of (Associate, Assistant) Professors 188

First publication year Value
before 1991 61
1991-2000 70
2001-2005 55
2006-2010 79
2011-2016 168
not published (yet) 96

Country of affiliation Value
Netherlands 519
United States 3
Germany 2
Brazil 1
DR of Congo 1
India 1
Italy 1
Latvia 1
Turkey 1

4.2 Organisation demographics VU&VUmc

Below, the numbers are given for the active scientific personnel of the VU on 30 June 2016. For VUmc (Medicine), numbers from the 2015 annual report are used.

Faculty Number of scientific personnel
Godgeleerdheid 86
Geesteswetenschappen 221
Rechtsgeleerdheid 219
Sociale Wetenschappen 224
Economische Wetenschappen en Bedrijfskunde 430
Exacte Wetenschappen 390
Aard- en Levenswetenschappen 450
Gedrags- en Bewegingswetenschappen 422
Geneeskunde (VUmc) 1079
Tandheelkunde (ACTA) 251

4.3 Survey disciplines and faculty

To normalize the numbers, and to calculate the representation of the survey respondents relative to the faculties, the following divisions are used.

Survey Discipline Faculty Number of scientific personnel
Physical Sciences Exacte wetenschappen 195
Engineering and Technology Exacte wetenschappen 195
Life Sciences Aard- en Levenswetenschappen 450
Medicine Aard- en Levenswetenschappen AND Geneeskunde AND Tandheelkunde (ACTA) 1780
Social Sciences and Economics Sociale Wetenschappen AND Economische Wetenschappen en Bedrijfskunde 654
Law Rechtsgeleerdheid 219
Arts & Humanities Godgeleerdheid AND Geesteswetenschappen 307
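With this mapping, the representation of each discipline in the survey can be estimated by dividing the respondent counts by the mapped staff numbers. A minimal sketch, using the figures from the tables above (the dictionary layout and function name are ours):

```python
# Survey respondents (multi-choice) and mapped scientific staff per discipline,
# taken from the tables above.
data = {
    "Physical Sciences": (39, 195),
    "Engineering and Technology": (35, 195),
    "Life Sciences": (144, 450),
    "Medicine": (181, 1780),
    "Social Sciences and Economics": (176, 654),
    "Law": (26, 219),
    "Arts & Humanities": (55, 307),
}

def representation(respondents: int, staff: int) -> float:
    """Share of a discipline's staff that responded, as a percentage."""
    return round(100 * respondents / staff, 1)

for discipline, (resp, staff) in data.items():
    print(f"{discipline}: {representation(resp, staff)}%")
```

Note that respondents could tick multiple disciplines, so these percentages slightly overstate the true response rate per discipline.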

6 VU&VUmc vs OECD countries

Although the survey has responses from many different countries, we limit the analysis to the 34 OECD member states (checked 3 May 2016), as these countries are more similar to the Netherlands, making comparison more meaningful. For example, respondents from countries with low GDP often use Zotero (free of charge), while EndNote (paid) is used more in countries with a higher GDP.

The figures below compare respondents from VU University to respondents from OECD countries. OECD respondents are indicated with solid coloured bars; VU respondent bars are hashed. All data is reported in percentages, that is, a solid bar up to 80 for Google Scholar in the Discovery_search graph indicates that 80 per cent of all OECD respondents reported using Google Scholar for search in the Discovery phase. We report all tools per subactivity.

Overall, differences between OECD and VU respondents are not very large, but a few tools stand out.

6.1 Discovery

Mendeley is used relatively often at the VU for reading and searching.

6.2 Analysis

Use of SPSS as a tool for analysis is much larger at the VU than for the OECD average.

6.3 Writing

As in the Discovery phase, Mendeley users for reference management are strongly represented at the VU. The preference for Mendeley comes at the expense of all other tools except EndNote. For the writing itself, VU respondents are relatively traditional, with high usage of MS Word and low usage of Google Docs and LaTeX.

6.4 Publication

Scopus usage is relatively low. Few VU respondents use the institutional repository for archival.

6.5 Outreach

6.6 Assessment

7 Tenured vs non-tenured researchers

In this section, we report on differences in tool usage between tenured and non-tenured researchers. We consider assistant professors, associate professors and full professors as tenured faculty; PhD students and postdoctoral researchers are grouped as non-tenured.

The first set of graphs is a quick summary of the tools that show the most pronounced differences between the two groups. We calculate the difference by subtracting the use in the tenured group from the use in the non-tenured group (both as percentages). The upper bars show the largest positive differences (i.e., the tool is more popular among non-tenured researchers); the lower bars show the largest negative differences.

The second set of graphs shows all tools in the survey, sorted by research phase and research activity, with the most pronounced differences at the far right and left of each diagram. We calculate the difference by subtracting the use in the tenured group from the use in the non-tenured group (both as percentages). The bars on the far right show the largest positive differences (i.e., the tool is more popular among non-tenured researchers); the bars on the far left show the largest negative differences (i.e., the tool is more popular among tenured researchers).
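The difference calculation can be sketched as follows. The Web of Science percentages come from the Assessment section; the other tools and numbers are illustrative placeholders, not survey results:

```python
# Usage percentages per tool as (tenured, non-tenured); the Web of Science
# figures are reported in the Assessment section, the rest are placeholders.
usage = {
    "Web of Science": (55.0, 20.0),
    "Mendeley": (25.0, 40.0),
    "PubMed": (30.0, 48.0),
}

def ranked_differences(usage):
    """Non-tenured minus tenured usage per tool, most positive first."""
    diffs = {tool: non_tenured - tenured
             for tool, (tenured, non_tenured) in usage.items()}
    return sorted(diffs.items(), key=lambda item: item[1], reverse=True)

for tool, diff in ranked_differences(usage):
    print(f"{tool}: {diff:+.1f}")
```

A positive difference means the tool is more popular among non-tenured researchers; a large negative difference, as for Web of Science here, means it is more popular among tenured researchers.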

7.1 Discovery

The differences in use of PubMed and table-of-contents alerts for journals stand out as the most pronounced discrepancies in the Discovery phase. Although it does not feature in the 'top-2' figures, Mendeley stands out when inspecting the more detailed graphs: non-tenured (generally younger) researchers use Mendeley more often in the Reading, Searching and Alert activities within the Discovery phase.

7.2 Analysis

Tool use for analysis is stronger among non-tenured researchers across the board. This holds for relatively new (and more open) tools such as R and Python, as well as for long-standing software such as Excel and MATLAB. The large difference for SPSS is no outlier. Tools for sharing analysis scripts are not very popular, and their usage is low overall. Somewhat unexpectedly, use of the Open Science Framework is stronger in the tenured than in the non-tenured group. This may be because the OSF is often used for grant applications, arguably a more important activity for tenured researchers.

7.3 Writing

The importance of Mendeley in the research workflow of non-tenured researchers is again apparent in the Writing phase. Among this group, Mendeley is the most popular reference management software, more popular than EndNote, which is the most popular reference tool for tenured researchers. For the writing itself, MS Word is by far the most popular tool among both groups.

7.4 Publication

In general, tenured researchers use more tools in the Publication phase, probably because they simply publish more. This makes it difficult to interpret these figures properly. A few tools stand out. First, PubMed is relatively popular among non-tenured researchers for archiving publications, although in absolute terms ResearchGate is the most popular repository for both groups. GitHub is used mostly by non-tenured researchers as a repository for scripts and software code.

7.5 Outreach

Tenured researchers seem to spend more effort on their research profile, as tool use in this phase is higher for that group. ResearchGate is popular among both groups. Although differences are less pronounced, tenured researchers also use more tools for outreach to a broader public.

7.6 Assessment

The difference in use of Web of Science indicators for impact assessment is striking: about 55% of tenured researchers indicate using the tool, versus approximately 20% of non-tenured researchers. Altmetrics and the PLOS metrics are not very popular (yet) in comparison, and are used by both groups, although slightly more by non-tenured researchers.

8 Open Access and Open Science

8.1 Open Intention for Tenures vs Non-tenures

Although people are not always familiar with the tools they can use for engaging in Open Science, overall they intend to be supportive of Open Access and Open Science.

Apart from the multiple-choice questions, there was one open question:

“What do you think will be the most important development in scholarly communication in the coming years?”

This question got a tremendous response (N=341; 64%), which gives a better view of respondents' worries and hopes about the future of conducting science. The answers show the directions in which academic support could invest.

We processed the free-text answers as follows: first we labelled each answer according to whether it expresses a threat or an opportunity; then we filtered to the senior researchers (N=188; 35%) to get a manageable sample; finally we labelled each answer with the research phase and activity it relates to. This way we hope to get a picture of their worries and hopes for the future of scholarly communication.
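The steps above amount to a small labelling pipeline; a minimal sketch, in which the answer texts, roles and labels are hypothetical stand-ins rather than the actual survey data:

```python
from collections import Counter

# Hypothetical free-text answers with the two labels described in the text:
# threat/opportunity, and the related research phase.
answers = [
    {"text": "open access will remove paywalls", "role": "professor",
     "label": "opportunity", "phase": "Publication"},
    {"text": "too many journals to trust", "role": "professor",
     "label": "threat", "phase": "Discovery"},
    {"text": "better sharing of data", "role": "phd",
     "label": "opportunity", "phase": "Analysis"},
]

# Step 2: filter to senior researchers for a manageable sample.
senior = [a for a in answers if a["role"] == "professor"]

# Steps 1 and 3: tally (label, phase) pairs over the filtered sample.
tally = Counter((a["label"], a["phase"]) for a in senior)
print(tally.most_common())
```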

8.1.1 Worries

The professors expressed worries (N=6, 1%) about the following. The number of journals and locations where publications can be found will increase, and if journals disappear because knowledge is democratised in a Wikipedia fashion, it becomes hard to trust or distinguish the quality of a scientific paper or fact (related to Discovery-Search and Assessment-Review). Within the realm of journals as the only option for publishing, there are worries on the one hand about open access journals that ask too high a price to publish, and on the other hand about the quality of open access journals, not only to read but also to publish in (related to Discovery-Access and Publication-Submit). In other words, the current perception is that a system is lacking that could create a market to increase the quality and trustworthiness of open access journals and reduce the price of publishing. (DOAJ.org and QOAM.eu are means of making the quality of open access journals transparent.)

  • The Wikipedification of science: everybody’s opinion will be counted equally. With devastating effect.
  • First Open Access pest will continue to explosively grow; Then, we will all be shocked about the mess, discuss how this was possible and try (but fail) to find those to blame; Return to pay for quality system instead of pay for publishing-no-matter-how-bad
  • As always in history, tendencies are like swings or pendula, going to and fro. The tendency of recent past to open up, to tear down walls and borders is approaching its apex. We will witness a return movement towards closing, protecting, guarding. Just as we see this happening in the world with respect to national boundaries. Academia is slowly already moving from a free space towards distinct ‘gated communities’ so to speak. Of course that tendency eventually will reach its apex as well. After which a renewed swing towards openness will take place. Probably not in my lifetime, however.
  • Increase in number of journals and publication locations
  • Unfortunately, more formulas to quantify science Output/Impact across very diverse fields.
  • Development of prices for journals and for publications in journals. Prices are too high, and resistance is growing.

8.1.2 Hopes

The professors (N=53, 10%) expressed the following hopes for the future of scholarly communication, organised by research phase:

8.1.2.1 Discovery

As a researcher you need to know the latest in your field without being swamped in irrelevant information, and once a relevant paper is found you must be able to read it without restriction. The ultimate hope is that there is no need to read through all papers, but to be alerted only to the latest and necessary information for your current research domain, drawn from the complete scientific corpus, openly accessible to knowledge aggregators and human readers alike. In terms of more traditional scholarly communication, there is hope for more clarity on the quality of scientific content, so that the papers remaining in the scientific corpus are of higher quality, or can at least be filtered on quality standards. Once a paper is found, the hope is that one day there will be no need to pay for access to it. And once a paper is accessible, there is hope for more interaction with the content (see for example readcube.com, utopiadocs.com) and with other readers (see for example hypothes.is).

8.1.2.1.1 sample of answers:
  • Search
  • reduction of output (less is more)
  • tools that aggregate scattered knowledge in papers;
  • More clarity about the quality of journals.
  • Researchgate as primary tool
  • Access
  • open access online, platform for sharing between researcher without paying
  • scientific articles as ‘free’ pdf in the ‘cloud’
  • Researchgate as primary tool
  • Read
  • Projects like Reader which allow much more interaction with papers
  • Researchgate as primary tool
  • tools that aggregate scattered knowledge in papers;

8.1.2.2 Analysis

When designing and conducting research, the hopes are that there will be more emphasis on replication: working in concurrence, and preregistering protocols and experiment designs. See for example osf.io, protocols.io, scientificprotocols.

The hope is also that working in concurrence forces researchers to deliver improved descriptions of methods and to increase the quality of the data, and that this will be required for any valid publication.

Research is so diverse and heterogeneous that no single tool can do the job; the scientific endeavour needs different tools for different research groups in different phases. To excel in science, the hope is that the university will apply 'super segmentation': giving researchers access to the best tools for the right task at the right moment. The 101 research tools database could be a start for this.

8.1.2.2.1 sample of answers:
  • Sharing
  • More emphasis on replication
  • More concurrence
  • It should be making Science reproducible again: better description of the methods, better quality data and structured obtained knowledge should be required for any publication.
  • Publication based on preregistration of protocols/experiments
  • Analyzing
  • One important concept in marketing is the idea of “super segmentation”. Lots of excellent tools are coming available, but they are differentials relevant for different people, and different people need to integrate them. It will become a challenge how to integrate that most optimally. My suspicion is that the university- slow as it is- won’t be able to handle this development. So either the universities adapt and let people operate in start-up like enterprises within the university, or good researchers break away from universities

8.1.2.3 Writing

The hope for scholarly communication in the writing phase is for more responsive communication, without researchers losing credit for their contributions. Writing could be done in smaller iterations, where feedback helps in building up towards a milestone. This resembles contributions in the open source software industry, where working with nightly builds and milestone releases is common practice.

One example of a platform built with responsive communication, version control, collaborative editing for mixed teams of LaTeX and rich-text writers, and commenting during drafting or after publishing, is overleaf.com.

There are also hopes for support in making complex material easier to fathom through visualisations and animations.

8.1.2.3.1 sample of answers:
  • Write
  • online communication tools
  • Open Access but also sites like Academia, where you just put your text online and receive feedback, even without a publisher in between.
  • Supporting written material by animation.

8.1.2.4 Publication

Hopes are that, when deciding where to submit a paper, indicators other than the journal impact factor will be considered, such as: the author's right to open access the publication, publication limits or APC budgets, quality factors of a journal (see qoam.eu and doaj.org), and the journal's data policy; even the necessity of publishing with a journal as intermediary can be questioned.

For publishing text, hopes are up for open access, removing gatekeepers and enabling text and data mining, but there is more. Hopes are also up for faster publication of smaller research deliverables in outlets other than journals: disseminating and claiming ideas in pre-print proofs, preregistering the research design or hypothesis, storing the findings with the processed/intermediate data along with the code/tool/app/container and the raw data used for processing, publishing the final results with links to the data, and evaluating results and data in peer-review channels. Some examples: platforms built for "publish now, review later" (f1000research.com), and for preregistering and sharing intermediate results (osf.io).

For hopes related to storing data, have a look at re3data.org for a complete overview of data repositories worldwide, where you can filter on subject, quality seals, persistence policy, reuse licences and more.

Of course, big-data lovers hope that peer-reviewed results can fuel knowlets and nanopublications, creating big-data knowledge graphs to improve scientific discovery methods and to highlight research opportunities in networks across researchers (nanopub.org).

8.1.2.4.1 sample of answers:
  • Submit
  • The manner in which publishers contract with institutions and authors about access and publishing.
  • Journals will lose dominance in favor of direct online publication, review and discussion
  • A cap on the # of papers published per researcher, to promote quality over quantity
  • More clarity about the quality of journals.
  • Elsevier will loose marketshare
  • Publishing
  • the further growth of open access journals
  • open access without print, fast publication rate (2 to 3 weeks following submission, see eLife)
  • Open access of publicly funded research.
  • detach scientists from traditional journal format and publishing
  • open access, other outlets than journals
  • Open access and blogging
  • Open access; Open data; preregistration; bypassing the regular journals
  • The need to advertise your own work and not get lost in the mass of papers that are published every day. In addition, the desire to make data available through repositories (such as a website, data dryad, or figshare)
  • Grey literature. Pre-print proofs of articles shared on the web, online discussions. So I think that official peer reviewed channels will remain important for evaluations, but less for the dissemination of ideas.
  • Open access + Publishing of raw and processed data and intermediate results
  • nanopublications and shareable knowlets
  • Researchgate as primary tool
  • Archiving
  • Self-archiving, Open Access
  • Datasharing
  • Sharing data
  • open acces publication and data sharing
  • sharing of data sets
  • (1) Open data & analysis; (2) tools that aggregate scattered knowledge in papers; (3) open access
  • sharing research data for reproducibility of research results
  • Open access to data, open review
  • The need to advertise your own work and not get lost in the mass of papers that are published every day. In addition, the desire to make data available through repositories (such as a website, data dryad, or figshare)
  • Open access + Publishing of raw and processed data and intermediate results
  • datasets on which publications are based will become available
  • online availability of data accompanying publications
  • Publication in the form of tools, apps to interpret data

8.1.2.5 Outreach

Hopes are to make it easier to advertise and track your own work on social networks like ResearchGate, academia.edu, Twitter and LinkedIn. For managing and tracking your outputs, see for example growkudos.com, or online trainings and workshops.

Also in this phase there are hopes for support in making complex material easier to fathom through visualisations and animations.

8.1.2.5.1 sample of answers:
  • Profile and Popular
  • Shift to open access publication and the shift from author’s own websites to Research Gate and other networks of that type.
  • Social Media such as ResearchGate
  • The need to advertise your own work and not get lost in the mass of papers that are published every day. In addition, the desire to make data available through repositories (such as a website, data dryad, or figshare)
  • Researchgate as primary tool

8.1.2.6 Assessment

Hopes are to decouple peer review from the publishing location or traditional journal; this would make a self-published and self-promoted contribution to your field recognisable as an evaluated and valid part of the scientific corpus. At the same time there is hope that the effort researchers put into rigorously peer-reviewing the work of others gets recognised as well. This can be done through open interactive discussion in comment sections, or in dedicated peer-review channels. For solutions that decouple peer review and provide recognition, have a look at publons.com, peerageofscience.org and rubriq.com. For informal open discussions, have a look at hypothes.is.

For impact, hopes are that assessment will be based not on the quantity of publications one can crank out, but on the quality of the contribution to science. Recognition could come from your contributions on GitHub, where your code gets reused and forked, or from the badges you have earned during different parts of your research; see openresearchbadges.org for this. To quantify the alternatives for scientific impact, have a look at impactstory.org or altmetric.com.

8.1.2.6.1 sample of answers:
  • Review
  • modernizing peer review and publishing beyond topical journals
  • Open access, open review and open discussions. Projects like Reader which allow much more interaction with papers
  • A change in how the reviewing process takes place. Some journals have already adopted a hybrid reviewing process that enables different types of submission procedures. I believe this trend will continue in an attempt to make the publication process more objective and less prone to all kinds of unethical behavior in order to conform to publication pressure.
  • online discussions such as on research gate
  • Open access to data, open review
  • interactive comments
  • Grey literature. Pre-print proofs of articles shared on the web, online discussions. So I think that official peer reviewed channels will remain important for evaluations, but less for the dissemination of ideas.
  • Post publication peer review opportunities, because peer review fails
  • Impact
  • I hope a shift away from quantity to quality
  • I am convinced that the absurd focussing on bibliometrics will diminish if not vanish
  • assessment on how to monitor impact

8.2 Open Intention per discipline

9 Tools per discipline

As a quick summary, we have made a table showing the single most used tool per research phase in each discipline. A more detailed explanation is given in the section below the table, but for the single most used tool we can state the following:

  • Discovery: Most disciplines use Google Scholar to discover new literature. Medicine uses PubMed as its primary source for search. One could say that Life Sciences find having campus access to literature more important than searching for it, but the detail section below gives a more elaborate explanation: their attention for search is spread between Google Scholar and PubMed.

  • Analysis: MS Excel is the most popular tool for analysis in all disciplines, except for Medicine where they use SPSS.

  • Writing: Here MS Word is the most popular tool for writing in all disciplines, except for Engineering&Technology where they use LaTeX.

  • Publication: Publishing in traditional topical journals is still by far the most popular publication method, despite the high support for Open Access.

  • Outreach: ResearchGate is the most popular platform for profiling your research within the research community, except in two disciplines that use it slightly less: Engineering&Technology uses Google Scholar Citations a bit more, and Arts&Humanities uses Academia.edu more.

  • Assessment: Physics, Medicine, Life Sciences and Law use Web of Science for assessment of their research, and the other disciplines use the Journal Citation Reports; both contain the same impact factor calculated from journals in the ISI database. Internationally there is a lot of debate about whether the merit of an article should count, rather than the merit of the journal. Also discussed is the reward system to give credit where credit is due.

In the sections below we show the tool usage for each research discipline side by side. This gives us the opportunity to see whether a discipline uses a tool more or less than others.

9.1 Discovery

  • For reading articles, the majority of the disciplines use PDF; half of them read online in a browser. Mendeley is also fairly well known, but less so in Law and Arts & Humanities, which seem to prefer iAnnotate instead. ReadCube seems to have a targeted audience in Physics and Engineering. Hypothesis and UtopiaDocs are promising young tools that have not seen a big uptake yet.
  • Google Scholar is overwhelmingly used for searching literature by all disciplines; in addition, PubMed is widely used in Medicine and Life Sciences. WorldCat is mostly used by Law and Arts & Humanities. Of the two competing bibliographic databases, Scopus and Web of Science, the latter has more uptake (not surprisingly, because the VU Library has a license), especially in Physics and Social Sciences and Economics. We are surprised that Scopus is used at all, since people must get access from elsewhere; most Scopus users are in Physics and Engineering. Compared to VU and VUmc, OECD respondents use Scopus almost a factor of three more. Mendeley, like Scopus an Elsevier product, is used as an alternative for searching literature, mostly by Life Sciences.
  • Alert services for discovering new literature are less well known, or less used because of the annoying overabundance of e-mails in the mailbox. To tackle this problem, young services like F1000Prime provide hand-picked, curated recommendations from senior researchers, and Sparrho uses adaptive algorithms to present only articles relevant to your specific research field from a wide variety of sources.
  • To gain access to literature, all disciplines rely on the subscriptions for on-campus access. When not on campus, or when journals fall outside the subscription packages, alternative methods are used, like ResearchGate, and asking the author directly where relationships are close. These results are no different from other OECD countries. Even the browser plugin Open Access Button is used, mostly in Medicine and Life Sciences, to gain access to toll-gated literature, either by searching for the open access alternative / author version, or by finding the e-mail address of the author. Pay-per-view is rarely used, and the model of renting articles in services like DeepDyve (a Spotify model for scientific articles) is still an unknown anomaly in scholarly communication. One could wonder which legal alternative will grow when subscriptions for on-campus access end.

Legend for the Discipline Colors

9.2 Analysis

  • Sharing the method or workflow of the analysis as part of the article is nowadays common practice. Having a separate platform for sharing the analysis, to make the research more easily reproducible, is not common practice yet. This might be because there is no honour in reproducing research, only in advancing science with new findings. But these platforms can also be used to preregister a hypothesis and method. We see in the Social Sciences more familiarity with the Open Science Framework, where there is more attention to preregistration of hypotheses and replication studies. In Medicine there is also a little familiarity with a service like Scientific Protocols.

    • Other sharing methods mentioned: Evernote, OneNote, Google Keep, Media Wiki, Google Drive, SURFdrive, Apple iCloud, Dropbox, E-mail, Institutional shared folders, Basecamp, GitHub, E-Notebook, eLabjournal.com, ResearchGate, Mindly, Paper, Trello, Design paper, Netherlands Trial Register, project websites, Podio, Clinical trial.com (edit:clinicaltrials.gov)
  • For the analysis itself, many disciplines use their own specific tools. Excel is the common tool for at least 50 per cent of the Law and Arts&Humanities communities, and for even more in the other disciplines. iPython, R and MATLAB are used mostly by Physics and Engineering&Technology, and R is also known among Life Scientists. SPSS is the commercial package used intensively in Medicine, Social Sciences&Economics and Life Sciences. Unknown as yet, but interesting for digital humanities, are DHbox and rOpenSci, both with ready-to-go configurations of computational tools: the first a runtime environment in the cloud with R and iPython, the other an extensive software library for R. In the survey the following tools were mentioned by VU and VUmc researchers in different disciplines. Some were mentioned frequently, like NVivo, or across disciplines, like Atlas.TI.

    • Physics: Fortran, Wolfram Mathematica, Linux, GAMS, ArcGIS, Origin, Gradeprofiler, Python, Java, C++, Comprehensive Meta-analysis (CMA), SAS, Mplus, Galaxy, PQ method, Atlas.TI, open office spreadsheet, Glotaran, R package TIMP
    • Engineering & Technology: Atlas.TI, OpenRefine, Python, Oxmetrics, Semantic web platforms, ACQknowledge, GraphPad, open office spreadsheet, Java, Glotaran, R package TIMP
    • Medicine: MaxQDA, Atlas.TI, ReviewManager, GraphPad Prism, SAS, Mplus, Apple Numbers, Stata, Comprehensive Meta-analysis (CMA), StatView, Review Manager (systematic reviews), FSL, Flowjo, MS Access, Snapgene, Accurri Analysis, instrument-specific software, MLwiN, MS Word, Mindmeister, Clone Manager, SoftMax, Vinci, Galaxy, Python, Java, C++, ACQknowledge, OpenMx, Plink, Wolfram Mathematica, PQ method, LibreOffice, picture analysis, Statistica, Spike/Signal
    • Social Science & Economics: Atlas.TI, Mplus, Stata, QSR Nvivo, MaxQDA, C++, SAS, Review Manager (systematic reviews), Comprehensive Meta-analysis (CMA), Python, MS Word, Transana, MLwiN, Oxmetrics, QGIS, Amos, LyX, WinEdt, JASP, Wolfram Mathematica, Gephi, UCINET, NodeXL, ORA, ConText, Netdraw (social network analysis software), AmCat (Amsterdam Content Analysis Toolkit), Python package Pandas, ArcGIS, GAMS, Mindmeister, SmartPLS, Dedoose, Lingo software, Maxima, EQS, PQ method, MS Access, fs/QCA software, Lisrel
    • Law: Atlas.TI, MS Word, various text mining tools
    • Arts & Humanities: MS Access, MS Word, Mplus, Atlas.TI, Python, MaxQDA, concordance software e.g. AntConc, AmCat (Amsterdam Content Analysis Toolkit)

9.3 Writing

  • MS Word is by far the favorite office tool for writing, but Engineering & Technology use Google Docs and Overleaf more often for collaborative writing than the other disciplines.
  • For managing references, EndNote is the most popular in all disciplines except Engineering & Technology, where they prefer Mendeley. EndNote and Mendeley have a similar user base, except that Mendeley is used vastly more by younger researchers. Something to look into in the future, but we can imagine that many PhD candidates want something that works right after a download, instead of first obtaining a license token from the university administration. Where Mendeley is used less by Arts & Humanities, this group does use the open source reference manager Zotero much more than the other disciplines.

9.4 Publication

  • Although all disciplines publish in traditional journals, it is researchers in Medicine and the Life Sciences who also publish in open access topical and mega-journals.
  • To decide which journal to submit an article to, the Journal Citation Reports ranking is still the leading indicator for most disciplines, except for researchers in Law, who seem to lead in the use of journal assessment platforms with an open access focus, like the Directory of Open Access Journals (DOAJ), Quality Open Access Market (QOAM), SHERPA/RoMEO and Journalysis. Also mentioned were Eigenfactor.org, advice from supervisors and peers, the metrics from Google Scholar, and similarity with and reputation of authors in reference lists.
  • Most disciplines recognize ResearchGate as a place for archiving and sharing publications. Archiving scientific output to safeguard the corpus for future generations is not yet common practice in all disciplines, but Physics and Engineering & Technology have used arXiv for years to publish preprints, both to speed up the scientific process and to claim their findings at that particular date. Other expected patterns are visible: Life Sciences and Medicine use PubMed Central, and the institutional repository is known across all disciplines. Surprisingly, SSRN is used much more in Law than in the Social Sciences, where we would have expected it. We expected bioRxiv to be familiar among the Life Sciences, but that service only started a few years ago.
  • Although a plethora of platforms is available to share and archive data, code and presentations, only GitHub and Bitbucket are mainly used by one discipline: Engineering & Technology. Other suggestions given for sharing code, data and presentations were: Open Science Framework (OSF), Dropbox, OneDrive, SURFdrive, SURFsara.nl, institutional shared folders, SPSS, SurveyMonkey, external hard drives, e-mail, EDUgroepen.nl, OpenClinica, Mendeley Data, SVN, GEO, GitLab, EGA, tranSMART, B2SHARE, own website, the supplement of a paper, Academia.edu, dedicated repositories, WeTransfer. For archiving data in a specific discipline, the best place to start is the Registry of Research Data Repositories at RE3data.org.

9.5 Outreach

  • We see that Engineering & Technology use Google Scholar Citations for researcher profiles more than other disciplines. This might be related to the recent obligation from the Faculty of Sciences to use this channel as the official outlet of their academic work. Again ResearchGate pops up as a platform that is broadly used in all disciplines to display one's work. Only where we see two dips for Arts & Humanities and Law at ResearchGate, they re-appear as spikes at Academia.edu. We also see that these disciplines are less familiar with ORCiD than the other disciplines. The profiles at our own institution are familiar, but we hear complaints about the lack of control and the slowness with which researchers can influence these pages. That is why respondents also mentioned their own website a number of times.
  • Twitter is the most popular outlet for mentioning scientific findings to the public among Arts & Humanities and Social Sciences & Economics, but Engineering & Technology also like to use it to show off their achievements. The same groups like to inform the public by placing information on their work on websites or blogging platforms like WordPress. In smaller numbers, but fairly evenly distributed across all disciplines, researchers improve Wikipedia with their findings. The startup GrowKudos.com, which manages the distribution and measures the impact of your work on blogs and social media networks (Twitter, Facebook, LinkedIn), is still largely unknown. Other public outlets mentioned were newspapers, Facebook and LinkedIn.
  • Archiving posters and presentations is not common practice; only Engineering & Technology use SlideShare, Figshare and Vimeo for this. Also mentioned were Prezi, Dropbox, YouTube, own websites, ResearchGate and Academia.edu.

9.6 Assessment

  • Using services for peer review beyond that organized by journals is largely unknown territory. The common practice is to go with the review process organized by a journal. Decoupling this process would make it possible to validate the research and maintain trust in the scientific findings, while publishing and spreading it across multiple platforms for greater reach. The only other method mentioned was discussing with peers.
  • To measure the impact of one's output, Life Sciences, Medicine, Physics and Social Sciences & Economics look at the Journal Citation Reports (JCR) and Web of Science as a reference. Other mentions were the Google Scholar citation index and, in Social Sciences & Economics, the Eigenfactor. Also, InspireBeta.net in high-energy physics is a good example of profile metrics: pre-print/post-print ratio, citation breakdown in clusters, filters on self-citation, publication type, co-authors, keyword frequencies, a publication timeline graph, etc. And the ERIM Journals List (EJL) from Erasmus University is a good example of impact assessment, scaled to a discipline-specific area with its own additional criteria.

10 Detailed overview for each discipline

In this overview we show the graphs focused on each discipline in each research phase. Next to these bars we place additional bars, so you can compare the discipline against the OECD average for that discipline and against the VU & VUmc average.
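How such comparison bars can be derived from the raw survey responses can be sketched as follows. This is a minimal illustration only: the column names, the tiny example table and the single tool column are hypothetical, not the actual survey data or the scripts used for this report.

```python
import pandas as pd

# Hypothetical survey extract: one row per respondent, recording the
# respondent's discipline and whether they reported using a given tool.
responses = pd.DataFrame({
    "discipline": ["Medicine", "Medicine", "Law", "Law", "Physics"],
    "uses_mendeley": [True, False, True, False, True],
})

# Percentage of tool users within each discipline (one bar per discipline).
per_discipline = responses.groupby("discipline")["uses_mendeley"].mean() * 100

# Overall VU & VUmc percentage, shown as the comparison bar next to
# each discipline's own bar.
overall = responses["uses_mendeley"].mean() * 100

comparison = per_discipline.to_frame("discipline_pct")
comparison["vu_avg_pct"] = overall
print(comparison.round(1))
```

The OECD comparison bar would be computed the same way from the international dataset, filtered to respondents of the same discipline.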

10.1 Arts & Humanities (N = 55)

10.1.1 Discovery

10.1.2 Analysis

10.1.3 Writing

10.1.4 Publication

10.1.5 Outreach

10.1.6 Assessment

10.2 Engineering & Technology (N = 35)

10.2.1 Discovery

10.2.2 Analysis

10.2.3 Writing

10.2.4 Publication

10.2.5 Outreach

10.2.6 Assessment

10.3 Law (N = 26)

10.3.1 Discovery

10.3.2 Analysis

10.3.3 Writing

10.3.4 Publication

10.3.5 Outreach

10.3.6 Assessment

10.4 Life Sciences (N = 144)

10.4.1 Discovery

10.4.2 Analysis

10.4.3 Writing

10.4.4 Publication

10.4.5 Outreach

10.4.6 Assessment

10.5 Medicine (N = 181)

10.5.1 Discovery

10.5.2 Analysis

10.5.3 Writing

10.5.4 Publication

10.5.5 Outreach

10.5.6 Assessment

10.6 Physical Sciences (N = 39)

10.6.1 Discovery

10.6.2 Analysis

10.6.3 Writing

10.6.4 Publication

10.6.5 Outreach

10.6.6 Assessment

10.7 Social Sciences & Economics (N = 176)

10.7.1 Discovery

10.7.2 Analysis

10.7.3 Writing

10.7.4 Publication

10.7.5 Outreach

10.7.6 Assessment